# Parameter-shared architecture

## ALBERT XXLarge v2
License: Apache-2.0
ALBERT XXLarge v2 is a large language model pre-trained with a masked language modeling objective. It uses a parameter-shared Transformer architecture with 12 repeated layers and 223 million parameters.
Tags: Large Language Model · English
Author: albert · Downloads: 19.79k · Likes: 20
## ALBERT XLarge v2
License: Apache-2.0
ALBERT XLarge v2 is a pretrained English model based on the Transformer architecture. It employs parameter sharing to reduce memory usage and is trained with masked language modeling and sentence-order prediction objectives.
Tags: Large Language Model · Transformers · English
Author: albert · Downloads: 2,195 · Likes: 11
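Both entries describe ALBERT's parameter-shared design: one set of layer weights is reused at every depth, instead of each layer owning its own parameters. A minimal NumPy sketch (toy dimensions, with a plain linear-plus-ReLU map standing in for a full Transformer block) illustrates why the parameter count stays constant no matter how many times the shared layer is applied:

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, n_layers = 16, 12  # toy width; 12 matches ALBERT's repeated-layer count

# One shared weight matrix reused by every "layer" (ALBERT-style sharing).
W_shared = rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)

def forward(x, n_layers, W):
    # Each layer application reuses the same parameters W.
    for _ in range(n_layers):
        x = np.maximum(x @ W, 0.0)  # linear map + ReLU as a stand-in block
    return x

x = rng.standard_normal((4, d_model))
y = forward(x, n_layers, W_shared)

# With sharing, the parameter count is independent of depth;
# an unshared stack would grow linearly with n_layers.
shared_params = W_shared.size               # 16 * 16 = 256
unshared_params = n_layers * W_shared.size  # 12 * 256 = 3072
print(shared_params, unshared_params, y.shape)
```

This is the trade-off the model descriptions point at: sharing cuts memory roughly by the layer count, while the repeated applications keep the effective depth of the network.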